Writing Style Conversion using Neural Machine Translation

Author

  • Se Won
Abstract

Writing style is an important indicator of a writer's persona. In the age of intelligent chatbots, writing style conversion can enable intimate human-AI interaction, helping to bridge the inherent gap between AI agents and human beings. In this paper, we apply a sequence-to-sequence neural machine translation model with a global attention mechanism to two writing style conversion tasks, focusing mostly on Shakespearean style conversion, to explore its capabilities and limitations. To acquire parallel corpora of two distinct writing styles, we propose a new data acquisition method that leverages the Google Translate engine when only a single corpus in the target writing style is available. Other decisions crucial to the model's performance include the embedding matrices of the source and target vocabularies, the hyperparameters, the global attention mechanism, and many other details of the bidirectional sequence-to-sequence model. The bidirectional Seq2Seq model we propose here outperformed previous style-mimicking models [1] in BLEU score by more than 25%. In addition, human-evaluated metrics showed that our bidirectional Seq2Seq model preserved the original meaning and imitated the target style better than our simpler attentive Seq2Seq model.
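As a rough illustration of the architecture named in the abstract, the sketch below pairs a bidirectional GRU encoder with a Luong-style global-attention decoder in PyTorch. This is a minimal sketch under stated assumptions, not the authors' implementation: the layer sizes, the choice of GRUs, and the dot-product attention score are illustrative, since the abstract does not specify the actual hyperparameters or framework.

```python
# Minimal sketch (not the authors' code): bidirectional encoder + global
# (Luong-style) attention over all encoder states. All names and sizes here
# are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BiSeq2SeqWithGlobalAttention(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, emb_dim=256, hid_dim=512):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb_dim)
        # Bidirectional encoder: each time step yields a 2*hid_dim state.
        self.encoder = nn.GRU(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        self.decoder = nn.GRU(emb_dim, 2 * hid_dim, batch_first=True)
        self.attn_out = nn.Linear(4 * hid_dim, 2 * hid_dim)
        self.proj = nn.Linear(2 * hid_dim, tgt_vocab)

    def forward(self, src_ids, tgt_ids):
        enc_out, _ = self.encoder(self.src_emb(src_ids))       # (B, S, 2H)
        dec_out, _ = self.decoder(self.tgt_emb(tgt_ids))        # (B, T, 2H)
        # Global attention: every decoder state attends over all encoder states.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))    # (B, T, S)
        weights = F.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                   # (B, T, 2H)
        combined = torch.tanh(self.attn_out(torch.cat([context, dec_out], dim=-1)))
        return self.proj(combined)                              # (B, T, V_tgt)


# Toy usage: batch of 2 source-style sentences and shifted target-style inputs.
model = BiSeq2SeqWithGlobalAttention(src_vocab=8000, tgt_vocab=8000)
src = torch.randint(0, 8000, (2, 12))
tgt = torch.randint(0, 8000, (2, 10))
logits = model(src, tgt)   # (2, 10, 8000), fed to a cross-entropy loss in training
```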


Related articles

A Comparative Study of English-Persian Translation of Neural Google Translation

Many studies abroad have focused on neural machine translation, and almost all concluded that this method was much closer to human translation than earlier machine translation. Therefore, this paper aimed at investigating whether neural machine translation was more acceptable in English-Persian translation in comparison with machine translation. Hence, two types of text were chosen to be translated...

Full text

Zero-Shot Style Transfer in Text Using Recurrent Neural Networks

Zero-shot translation is the task of translating between a language pair where no aligned data for the pair is provided during training. In this work we employ a model that creates paraphrases which are written in the style of another existing text. Since we provide the model with no paired examples from the source style to the target style during training, we call this task zero-shot style tra...

Full text

Looking for Low-proficiency Sentences in ELL Writing

Determining whether an author is writing in their native language (L1) or a second language (L2) is a problem that lies at the intersection of four traditional NLP tasks: native language identification, similar language identification, detecting translationese, and grammatical error correction. In general, the goal of the language learner is to improve their proficiency until their writing is i...

Full text

A Simple and Strong Baseline: NAIST-NICT Neural Machine Translation System for WAT2017 English-Japanese Translation Task

This paper describes the details about the NAIST-NICT machine translation system for WAT2017 English-Japanese Scientific Paper Translation Task. The system consists of a language-independent tokenizer and an attentional encoder-decoder style neural machine translation model. According to the official results, our system achieves higher translation accuracy than any systems submitted previous ca...

Full text

Domain specialization: a post-training domain adaptation for Neural Machine Translation

Domain adaptation is a key feature in Machine Translation. It generally encompasses terminology, domain and style adaptation, especially for human postediting workflows in Computer Assisted Translation (CAT). With Neural Machine Translation (NMT), we introduce a new notion of domain adaptation that we call “specialization” and which is showing promising results both in the learning speed and in...

Full text


Journal:

Volume   Issue

Pages  -

Publication date: 2017